Robust regression in RKHS - An overview

Authors

  • George Papageorgiou
  • Pantelis Bouboulis
  • Sergios Theodoridis
Abstract

The paper deals with the task of robust nonlinear regression in the presence of outliers. The problem is treated in the context of reproducing kernel Hilbert spaces (RKHS). In contrast to more classical approaches, a recent trend is to model the outliers as a sparse noise vector and to mobilize tools from sparsity-aware/compressed-sensing theory to impose sparsity on it. In this paper, three of the most popular approaches are considered and compared. These represent three major directions in the sparsity-aware learning context: (a) a greedy approach; (b) a convex relaxation of the sparsity-promoting task via l1-norm-based regularization of the least-squares cost; and (c) a Bayesian approach making use of appropriate priors associated with the involved parameters.
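As a concrete illustration of direction (b), the sketch below fits a kernel ridge model while representing the outliers as a sparse additive vector u, alternating a ridge solve with soft-thresholding of the residuals (the l1 proximal step). This is a minimal sketch under stated assumptions, not the paper's algorithm; the function names, the Gaussian kernel choice, and all hyper-parameters are illustrative.

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def robust_krr(X, y, lam=1.0, mu=0.5, sigma=1.0, iters=100):
    """Alternating minimisation of
        min_{a,u} ||y - K a - u||^2 + lam * a^T K a + mu * ||u||_1,
    where the sparse vector u models the outlier component."""
    n = len(y)
    K = rbf_gram(X, X, sigma)
    u = np.zeros(n)
    A = K + lam * np.eye(n)                  # ridge system matrix
    for _ in range(iters):
        a = np.linalg.solve(A, y - u)        # kernel ridge step on "cleaned" targets
        r = y - K @ a                        # current residuals
        u = np.sign(r) * np.maximum(np.abs(r) - mu / 2.0, 0.0)  # soft threshold
    return a, u
```

Points whose residual magnitude stays below mu/2 are treated as inliers (u_i = 0); larger residuals are absorbed into u, so the ridge step effectively fits only the clean data.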


Similar articles

A Unified and Comprehensible View of Parametric and Kernel Methods for Genomic Prediction with Application to Rice

One objective of this study was to provide readers with a clear and unified understanding of the parametric statistical and kernel methods used for genomic prediction, and to compare some of these in the context of rice breeding for quantitative traits. A further objective was to provide a simple and user-friendly R package, named KRMM, which allows users to perform RKHS regression with...


Multivariate Bayesian Kernel Regression Model for High Dimensional Data and its Practical Applications in Near Infrared (NIR) Spectroscopy

Non-linear regression based on reproducing kernel Hilbert spaces (RKHS) has recently become very popular for fitting high-dimensional data. The RKHS formulation provides an automatic dimension reduction of the covariates, which is particularly helpful when the number of covariates ($p$) far exceeds the number of data points. In this paper, we introduce a Bayesian nonlinear multivariate regression m...
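The dimension-reduction point can be made concrete: in the dual (kernel) formulation, fitting requires solving an n-by-n Gram system, independent of the covariate dimension p. Below is a minimal, non-Bayesian kernel ridge regression sketch illustrating this; the names and hyper-parameters are illustrative, not from the cited paper.

```python
import numpy as np

def kernel_ridge_predict(X, y, Xq, lam=0.1, sigma=10.0):
    """Dual-form kernel ridge regression: the linear system solved is
    n x n (n = number of samples), no matter how large the covariate
    dimension p is -- useful when p >> n."""
    def gram(A, B):
        # Gaussian (RBF) Gram matrix between the rows of A and B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    alpha = np.linalg.solve(gram(X, X) + lam * np.eye(len(y)), y)
    return gram(Xq, X) @ alpha
```

With n = 30 samples and p = 500 covariates, the solve above still involves only a 30-by-30 matrix.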


Feature-to-Feature Regression for a Two-Step Conditional Independence Test

The algorithms for causal discovery, and more broadly for learning the structure of graphical models, require well-calibrated and consistent conditional independence (CI) tests. We revisit CI tests that are based on two-step procedures involving regression with a subsequent (unconditional) independence test (RESIT) on the regression residuals, and investigate the assumptions under which these tes...
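The two-step idea can be sketched directly: regress X and Y on the conditioning variable Z, then run an unconditional independence test on the two residual vectors. The version below is deliberately simple, using linear least-squares regression and a permutation test on the residual correlation; practical RESIT variants use more flexible regressors and stronger independence tests, and all names here are illustrative.

```python
import numpy as np

def resit_ci_test(x, y, z, n_perm=500, seed=0):
    """Two-step CI test sketch: regress x and y on z, then
    permutation-test the correlation of the residuals.
    A small p-value suggests x and y are dependent given z."""
    Z = np.column_stack([z, np.ones_like(z)])          # design: [z, intercept]
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residual of x on z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residual of y on z
    stat = abs(np.corrcoef(rx, ry)[0, 1])
    rng = np.random.default_rng(seed)
    null = [abs(np.corrcoef(rng.permutation(rx), ry)[0, 1])
            for _ in range(n_perm)]                    # null by permuting rx
    return (1 + sum(s >= stat for s in null)) / (1 + n_perm)
```

When Z is a common cause of X and Y, the residuals are independent and the p-value is large; when Y depends on X directly, the residual correlation survives and the p-value is small.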


Nonparametric methods for incorporating genomic information into genetic evaluations: an application to mortality in broilers.

Four approaches using single-nucleotide polymorphism (SNP) information (the F(infinity)-metric model, kernel regression, reproducing kernel Hilbert spaces (RKHS) regression, and a Bayesian regression) were compared with a standard procedure of genetic evaluation (E-BLUP) of sires, using mortality rates in broilers as the response variable and working in a Bayesian framework. Late mortality (14-42 days of...


Technical Report: Using Laplacian Methods, RKHS Smoothing Splines and Bayesian Estimation as a framework for Regression on Graph and Graph Related Domains

1 Laplacian Methods: An Overview
1.1 Definition: The Laplacian Operator of a Graph
1.2 Properties of the Laplacian and its Spectrum
1.2.1 Spectrum of L and L̃: graph eigenvalues and eigenvectors
1.2.2 Other interesting/useful properties of the normalized Laplacian (Chung)
1.2.3 Laplacians of Weight...



Journal:

Volume   Issue

Pages  -

Published 2015